On the Vγ dimension for regression in Reproducing Kernel Hilbert Spaces
Authors
Abstract
This paper computes the Vγ dimension for regression in bounded subspaces of Reproducing Kernel Hilbert Spaces (RKHS) for the Support Vector Machine (SVM) regression ε-insensitive loss function Lε and for general Lp loss functions. The Vγ dimension is shown to be finite, which in turn establishes uniform convergence in probability for regression machines in RKHS subspaces that use the Lε or general Lp loss functions; a novel proof of this result is given. Under certain conditions, an upper bound on the Vγ dimension is also computed, leading to an approach for estimating the empirical Vγ dimension from a set of training data.
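For reference, the two loss functions named above have the following standard forms; the notation below (regression estimate f, target y, ε > 0, p ≥ 1) is supplied here for illustration and is not quoted from the paper:

% ε-insensitive loss used in SVM regression: residuals inside the tube |y - f(x)| <= ε incur no cost.
\[
L_\varepsilon\bigl(y, f(x)\bigr) \;=\; \max\bigl(0,\; |y - f(x)| - \varepsilon\bigr)
\]
% General Lp loss: the p-th power of the absolute residual.
\[
L_p\bigl(y, f(x)\bigr) \;=\; |y - f(x)|^{p}, \qquad p \ge 1
\]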
Similar Resources
Some Properties of Reproducing Kernel Banach and Hilbert Spaces
This paper is devoted to the study of reproducing kernel Hilbert spaces. We focus on multipliers of reproducing kernel Banach and Hilbert spaces. In particular, we try to extend this concept and prove some related theorems. Moreover, we focus on reproducing kernels in vector-valued reproducing kernel Hilbert spaces. In particular, we extend reproducing kernels to relative reproducing kernels an...
Fisher’s Linear Discriminant Analysis for Weather Data by reproducing kernel Hilbert spaces framework
With recent advances in science and technology, data of a functional nature have become easy to collect, so the statistical analysis of such data is of great importance. As in multivariate analysis, linear combinations of random variables play a key role in functional analysis. The theory of Reproducing Kernel Hilbert Spaces is very important in this context. In this paper we study a gen...
Solving multi-order fractional differential equations by reproducing kernel Hilbert space method
In this paper we propose a relatively new semi-analytical technique to approximate the solution of nonlinear multi-order fractional differential equations (FDEs). We present some results concerning the uniqueness of solutions of nonlinear multi-order FDEs and discuss the existence of solutions for nonlinear multi-order FDEs in reproducing kernel Hilbert space (RKHS). We further give an error a...
Quantile Regression in Reproducing Kernel Hilbert Spaces
In this paper we consider quantile regression in reproducing kernel Hilbert spaces, which we refer to as kernel quantile regression (KQR). We make three contributions: (1) we propose an efficient algorithm that computes the entire solution path of the KQR, with essentially the same computational cost as fitting one KQR model; (2) we derive a simple formula for the effective dimension of the KQR...
Gradient-based kernel dimension reduction for regression
This paper proposes a novel approach to linear dimension reduction for regression using nonparametric estimation with positive definite kernels or reproducing kernel Hilbert spaces. The purpose of the dimension reduction is to find directions in the explanatory variables that explain the response sufficiently: this is called sufficient dimension reduction. The proposed method is based on a...